Eigenspace-based Linear Transformation Approach for Rapid Speaker Adaptation

Authors

  • Kuan-Ting Chen
  • Hsin-Min Wang
Abstract

This paper presents our recent work on an eigenspace-based linear transformation approach for rapid speaker adaptation. The proposed approach to prior density selection for the MAPLR framework introduces a priori knowledge analysis of the training speakers via probabilistic principal component analysis (PPCA), so as to construct an eigenspace for the speaker-specific full regression matrices and to derive a set of bases called eigen-transformations. The prior densities of the MAPLR transformations for each new (outside) speaker are then chosen in the space spanned by the first few eigen-transformations. By incorporating the PPCA model of the transformation parameters into the MAPLR scheme, the number of free parameters is significantly reduced, while the underlying structure of the acoustic space and the precise modeling of inter-dimensional correlations among the model parameters are well preserved. Rapid supervised adaptation experiments showed that the proposed approach not only is superior to the conventional MLLR approach using either diagonal or block-diagonal regression matrices, but also outperforms, by a large margin, full-matrix MLLR with either a global transformation or multiple transformations corresponding to different phonetic classes.
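The construction of eigen-transformations and the constrained adaptation of a new speaker can be summarized in a few lines of linear algebra. The following is a minimal sketch, not the paper's implementation: it assumes plain PCA via SVD in place of PPCA, made-up matrix sizes and speaker counts, and random placeholder weights where the MAPLR estimation from adaptation data would actually go.

    # Minimal sketch of the eigen-transformation idea (assumptions noted above).
    import numpy as np

    rng = np.random.default_rng(0)

    n_speakers = 50        # training speakers (hypothetical)
    dim = 39               # acoustic feature dimension (hypothetical)
    n_eigen = 5            # number of eigen-transformations retained

    # One full (dim x (dim+1)) MLLR regression matrix per training speaker,
    # vectorized into rows of a data matrix.
    W = rng.normal(size=(n_speakers, dim * (dim + 1)))

    # Centre the vectorized transformations and extract principal directions
    # (plain SVD stands in for PPCA here).
    mean_w = W.mean(axis=0)
    U, S, Vt = np.linalg.svd(W - mean_w, full_matrices=False)
    eigen_transforms = Vt[:n_eigen]          # the "eigen-transformations"

    # A new speaker's transformation is constrained to the span of the first
    # few eigen-transformations around the mean; only n_eigen weights remain
    # to be estimated from the (very limited) adaptation data.
    weights = rng.normal(size=n_eigen)       # placeholder for MAPLR-estimated weights
    w_new = mean_w + weights @ eigen_transforms
    W_new = w_new.reshape(dim, dim + 1)      # back to a full regression matrix
    print(W_new.shape)                       # (39, 40)

The point of the sketch is the reduction in free parameters: instead of estimating a full dim x (dim+1) matrix from sparse adaptation data, only a handful of combination weights need to be estimated.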

Related articles

Fast speaker adaptation using eigenspace-based maximum likelihood linear regression

This paper presents an eigenspace-based fast speaker adaptation approach which can improve the modeling accuracy of the conventional maximum likelihood linear regression (MLLR) techniques when only very limited adaptation data is available. The proposed eigenspace-based MLLR approach was developed by introducing a priori knowledge analysis on the training speakers via PCA, so as to construct an...

Eigenspace-based speaker adaptation methods in Persian speech recognition systems

Among speaker adaptation algorithms, eigenvoice (EV) and eigenspace-based MLLR (EMLLR) adaptation approaches have been proposed for rapid adaptation with very limited adaptation data. In these methods, a speaker-adapted model is constrained to be a weighted combination of some orthogonal basis vectors. In this manner, both the number of parameters to be estimated from the adaptation data, and t...

Eigenspace-based maximum a posteriori linear regression for rapid speaker adaptation

In this paper, we present an eigenspace-based approach toward prior density selection for the MAPLR framework. The proposed eigenspace-based MAPLR approach was developed by introducing a priori knowledge analysis on the training speakers via probabilistic principal component analysis (PPCA), so as to construct an eigenspace for speaker-specific full regression matrices as well as to derive a se...

Improving eigenspace-based MLLR adaptation by kernel PCA

Eigenspace-based MLLR (EMLLR) adaptation has been shown effective for fast speaker adaptation. It applies the basic idea of eigenvoice adaptation, and derives a small set of eigenmatrices using principal component analysis (PCA). The MLLR adaptation transformation of a new speaker is then a linear combination of the eigenmatrices. In this paper, we investigate the use of kernel PCA to find the ...

Robust Speaker Clustering in Eigenspace

In this paper we propose a speaker clustering scheme working in 'Eigenspace'. Speaker models are transformed to a low-dimensional subspace using 'Eigenvoices'. For the speaker clustering procedure, simple distance measures, e.g. the Euclidean distance, can be applied. Moreover, clustering can be accomplished with base models (for Eigenvoice projection) such as Gaussian mixture models as well as conventi...
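As a rough illustration of the projection-and-clustering idea described in this abstract, the following is a minimal sketch under stated assumptions, not the paper's method: speaker models are represented as GMM mean supervectors of a made-up size, the eigenvoice basis is obtained with plain PCA (SVD), and k-means with Euclidean distance stands in for the clustering procedure.

    # Minimal sketch of clustering speakers in a low-dimensional eigenspace.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)

    n_speakers = 40         # speakers to cluster (hypothetical)
    supervector_dim = 512   # stacked GMM mean parameters (hypothetical)
    n_eigenvoices = 10      # dimensionality of the eigenspace

    # Speaker-dependent supervectors (one row per speaker).
    X = rng.normal(size=(n_speakers, supervector_dim))

    # Eigenvoice basis from the supervectors (plain PCA via SVD).
    mean_x = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean_x, full_matrices=False)
    eigenvoices = Vt[:n_eigenvoices]

    # Project every speaker into the low-dimensional eigenspace, where simple
    # Euclidean distances are meaningful, and cluster the coordinates.
    coords = (X - mean_x) @ eigenvoices.T
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(coords)
    print(labels)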

Publication date: 2001